Memristive Stochastic Computing for Deep Learning Parameter Optimization

Authors

Abstract

Stochastic Computing (SC) is a computing paradigm that allows for the low-cost and low-power computation of various arithmetic operations using stochastic bit streams and digital logic. In contrast to conventional representation schemes used within the binary domain, the sequence of bits in the stochastic domain is inconsequential, and computation is usually non-deterministic. In this brief, we exploit the stochasticity during the switching of probabilistic Conductive Bridging RAM (CBRAM) devices to efficiently generate stochastic bit streams in order to perform Deep Learning (DL) parameter optimization, reducing the size of Multiply and Accumulate (MAC) units by 5 orders of magnitude. We demonstrate that in a 40-nm Complementary Metal Oxide Semiconductor (CMOS) process our scalable architecture occupies 1.55mm$^2$ and consumes approximately 167$\mu$W when optimizing the parameters of a Convolutional Neural Network (CNN) while it is being trained for a character recognition task, observing no notable reduction in accuracy post-training.
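The hardware saving the abstract describes comes from how SC encodes numbers: a value p in [0, 1] becomes a Bernoulli bit stream whose fraction of 1s approximates p, and multiplying two independent unipolar streams reduces to a single AND gate per bit. A minimal software sketch of this encoding (illustrative only; function names and stream lengths are not from the paper):

```python
import random

def to_stream(p, n, rng):
    """Encode probability p as a stochastic bit stream of length n."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def from_stream(bits):
    """Decode: the fraction of 1s estimates the encoded value."""
    return sum(bits) / len(bits)

def sc_multiply(a_bits, b_bits):
    """Multiplication of two independent unipolar streams is a bitwise AND."""
    return [a & b for a, b in zip(a_bits, b_bits)]

rng = random.Random(0)
n = 10_000
a, b = 0.8, 0.5
prod = from_stream(sc_multiply(to_stream(a, n, rng), to_stream(b, n, rng)))
print(prod)  # close to a * b = 0.40
```

The estimate's error shrinks as 1/sqrt(n), which is the usual SC trade-off: longer streams buy accuracy at the cost of latency. The paper's contribution is generating such streams directly from probabilistic CBRAM switching rather than from pseudo-random CMOS circuitry.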


Similar sources

Distributed stochastic optimization for deep learning

We study the problem of how to distribute the training of large-scale deep learning models in the parallel computing environment. We propose a new distributed stochastic optimization method called Elastic Averaging SGD (EASGD). We analyze the convergence rate of the EASGD method in the synchronous scenario and compare its stability condition with the existing ADMM method in the round-robin sche...
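EASGD couples each worker's parameters to a shared center variable through an elastic penalty: workers are pulled toward the center while the center is pulled toward the workers, allowing local exploration without divergence. A hedged sketch of the synchronous update on a toy quadratic (the toy loss, learning rate, and penalty values are illustrative assumptions, not taken from the abstract):

```python
import random

def easgd_step(workers, center, grads, lr=0.1, rho=0.5):
    """One synchronous EASGD update.

    Each worker takes a gradient step plus an elastic pull toward the
    center; the center moves toward the workers by the same coupling.
    """
    alpha = lr * rho
    new_workers = [x - lr * g - alpha * (x - center)
                   for x, g in zip(workers, grads)]
    new_center = center + alpha * sum(x - center for x in workers)
    return new_workers, new_center

# Toy problem: every worker minimizes f(x) = (x - 3)^2 with noisy gradients.
rng = random.Random(0)
workers, center = [0.0, 1.0, 2.0, 4.0], 0.0
for _ in range(200):
    grads = [2 * (x - 3) + rng.gauss(0, 0.1) for x in workers]
    workers, center = easgd_step(workers, center, grads)
print(center)  # settles near the minimizer, 3.0
```

The coupling strength rho trades off exploration (small rho, workers drift apart) against consensus (large rho, workers track the center tightly).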


Distributed stochastic optimization for deep learning (thesis)

We study the problem of how to distribute the training of large-scale deep learning models in the parallel computing environment. We propose a new distributed stochastic optimization method called Elastic Averaging SGD (EASGD). We analyze the convergence rate of the EASGD method in the synchronous scenario and compare its stability condition with the existing ADMM method in the round-robin sche...


Stochastic Memristive Devices for Computing and Neuromorphic Applications

Nanoscale resistive switching devices (memristive devices, or memristors) have been studied for a number of applications ranging from non-volatile memory and logic to neuromorphic systems. However, a major challenge is to address the potentially large variations in space and time in these nanoscale devices. Here we show that in metal-filament-based memristive devices the switching can be fully stoch...


A Hybrid Optimization Algorithm for Learning Deep Models

Deep learning is a subset of machine learning that is widely used in Artificial Intelligence (AI) fields such as natural language processing and machine vision. The learning algorithms require optimization in multiple aspects. Generally, model-based inference needs to solve an optimization problem. In deep learning, the most important problem that can be solved by optimization is neural n...




Journal

Journal title: IEEE Transactions on Circuits and Systems II: Express Briefs

Year: 2021

ISSN: 1549-7747, 1558-3791

DOI: https://doi.org/10.1109/tcsii.2021.3065932